
    Capacity of a Class of Deterministic Relay Channels

    The capacity of a class of deterministic relay channels with transmitter input X, receiver output Y, relay output Y_1 = f(X, Y), and a separate communication link from the relay to the receiver with capacity R_0 is shown to be C(R_0) = \max_{p(x)} \min \{ I(X;Y) + R_0, I(X; Y, Y_1) \}. Thus every bit from the relay is worth exactly one bit to the receiver. Two alternative coding schemes are presented that achieve this capacity. The first scheme, "hash-and-forward", is based on a simple yet novel use of random binning on the space of relay outputs, while the second scheme uses the usual "compress-and-forward". In fact, these two schemes can be combined to give a class of optimal coding schemes. As a corollary, this relay capacity result confirms a conjecture by Ahlswede and Han on the capacity of a channel with rate-limited state information at the decoder in the special case when the channel state is recoverable from the channel input and the output. Comment: 17 pages, submitted to IEEE Transactions on Information Theory.
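
    As a quick illustration of the capacity formula (not taken from the paper), the Python sketch below evaluates C(R_0) = \max_{p(x)} \min\{ I(X;Y) + R_0, I(X;Y,Y_1) \} for a hypothetical toy instance: a binary symmetric channel with crossover probability eps, the deterministic relay observation Y_1 = X XOR Y, and a grid search over p(x).

    import numpy as np

    def entropy_bits(p):
        p = np.asarray(p, dtype=float).ravel()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def relay_capacity(R0, eps=0.1, grid=1001):
        # Toy instance (assumed for illustration): X in {0, 1}, Y is the output of a BSC(eps),
        # and the relay observes Y1 = f(X, Y) = X XOR Y, i.e. the channel noise.
        best = 0.0
        for q in np.linspace(0.0, 1.0, grid):
            px = np.array([1.0 - q, q])
            pyx = np.array([[1 - eps, eps], [eps, 1 - eps]])  # p(y|x)
            pxy = px[:, None] * pyx                           # joint p(x, y)
            I_XY = entropy_bits(px) + entropy_bits(pxy.sum(axis=0)) - entropy_bits(pxy)
            # Since (Y, Y1) determines X here (X = Y XOR Y1), I(X; Y, Y1) = H(X).
            I_XYY1 = entropy_bits(px)
            best = max(best, min(I_XY + R0, I_XYY1))
        return best

    print(relay_capacity(R0=0.3))   # about 0.83 bits for eps = 0.1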

    State Amplification

    We consider the problem of transmitting data at rate R over a state-dependent channel p(y|x,s) with the state information available at the sender, while at the same time conveying information about the channel state itself to the receiver. The amount of state information that can be learned at the receiver is captured by the mutual information I(S^n; Y^n) between the state sequence S^n and the channel output Y^n. The optimal tradeoff is characterized between the information transmission rate R and the state uncertainty reduction rate \Delta, when the state information is either causally or noncausally available at the sender. This result is closely related and in a sense dual to a recent study by Merhav and Shamai, which solves the problem of masking the state information from the receiver rather than conveying it. Comment: 9 pages, 4 figures, submitted to IEEE Trans. Inform. Theory, revised.
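
    A toy numerical illustration of the quantity \Delta follows (this is not the paper's coding scheme; the channel Y = X XOR S XOR N, the Bernoulli parameters, and the two sender strategies are assumptions made for the example): staying silent (X = 0) reveals the state to the receiver, while sending X = S cancels it, the latter extreme corresponding to the masking problem of Merhav and Shamai.

    import numpy as np

    def mutual_information_bits(pab):
        """I(A;B) in bits from a joint distribution given as a 2-D array."""
        pa = pab.sum(axis=1, keepdims=True)
        pb = pab.sum(axis=0, keepdims=True)
        mask = pab > 0
        return float(np.sum(pab[mask] * np.log2(pab[mask] / (pa @ pb)[mask])))

    def joint_state_output(strategy, eps=0.1):
        # Assumed toy channel: Y = X XOR S XOR N, S ~ Bernoulli(1/2), N ~ Bernoulli(eps).
        p = np.zeros((2, 2))
        for s in (0, 1):
            x = 0 if strategy == "reveal" else s     # "mask": X = S cancels the state
            for n in (0, 1):
                y = x ^ s ^ n
                p[s, y] += 0.5 * (eps if n else 1 - eps)
        return p

    for strategy in ("reveal", "mask"):
        print(strategy, "Delta = I(S;Y) =",
              round(mutual_information_bits(joint_state_output(strategy)), 3))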

    Asynchronous multiple-access channel capacity

    The capacity region for the discrete memoryless multiple-access channel without time synchronization at the transmitters and receivers is shown to be the same as the known capacity region for the ordinary multiple-access channel. The proof utilizes time sharing of two optimal codes for the ordinary multiple-access channel and uses maximum likelihood decoding over shifts of the hypothesized transmitter words.
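
    For reference, the "known capacity region" being matched here is the standard two-user discrete memoryless MAC region (textbook form, stated in my notation rather than the paper's):

    \[
      \mathcal{C} \;=\; \overline{\operatorname{co}} \bigcup_{p(x_1)\,p(x_2)}
      \Bigl\{ (R_1, R_2) :\;
        R_1 \le I(X_1; Y \mid X_2),\;
        R_2 \le I(X_2; Y \mid X_1),\;
        R_1 + R_2 \le I(X_1, X_2; Y)
      \Bigr\},
    \]

    where \overline{\operatorname{co}} denotes the closure of the convex hull, the operation realized operationally by the time-sharing argument mentioned above.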

    Information Entropy in Cosmology

    The effective evolution of an inhomogeneous cosmological model may be described in terms of spatially averaged variables. We point out that in this context, quite naturally, a measure arises which is identical to a fluid model of the 'Kullback-Leibler Relative Information Entropy', expressing the distinguishability of the local inhomogeneous mass density field from its spatial average on arbitrary compact domains. We discuss the time evolution of 'effective information' and explore some implications. We conjecture that the information content of the Universe -- measured by the Relative Information Entropy of a cosmological model containing dust matter -- is increasing. Comment: LaTeX, PRL style, 4 pages; to appear in PRL.
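
    For concreteness, the fluid-model relative information entropy referred to above has the Kullback-Leibler form below (notation assumed here: \rho is the local mass density, \langle\rho\rangle_D its spatial average over a compact domain D with volume V_D, and d\mu the spatial volume element):

    \[
      \mathcal{S}\{\rho \,\|\, \langle\rho\rangle_D\}
      \;=\; \int_D \rho \,\ln\!\frac{\rho}{\langle\rho\rangle_D}\; d\mu,
      \qquad
      \langle\rho\rangle_D \;=\; \frac{1}{V_D}\int_D \rho \; d\mu .
    \]

    This quantity vanishes exactly when the density field is homogeneous on D, i.e. when \rho = \langle\rho\rangle_D.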

    Eight problems in information theory

    Ahlswede R. Eight problems in information theory. In: Cover TM, ed. Open problems in communication and computation. New York: Springer; 1987: 39-42.

    Dynamical response of the Hodgkin-Huxley model in the high-input regime

    The response of the Hodgkin-Huxley neuronal model subjected to stochastic uncorrelated spike trains originating from a large number of inhibitory and excitatory post-synaptic potentials is analyzed in detail. The model is examined in its three fundamental dynamical regimes: silence, bistability and repetitive firing. Its response is characterized in terms of statistical indicators (interspike-interval distributions and their first moments) as well as of dynamical indicators (autocorrelation functions and conditional entropies). In the silent regime, the coexistence of two different coherence resonances is revealed: one occurs at quite low noise and is related to the stimulation of subthreshold oscillations around the rest state; the second one (at intermediate noise variance) is associated with the regularization of the sequence of spikes emitted by the neuron. Bistability in the low-noise limit can be interpreted in terms of jumping processes across barriers activated by stochastic fluctuations. In the repetitive firing regime a maximization of incoherence is observed at finite noise variance. Finally, the mechanisms responsible for spike triggering in the various regimes are clearly identified. Comment: 14 pages, 24 figures in eps, submitted to Physical Review E.
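
    The interspike-interval indicators mentioned above are straightforward to compute from a recorded spike train; the following is a minimal helper (illustrative only, not the analysis code used in the paper):

    import numpy as np

    def isi_statistics(spike_times):
        """Interspike-interval statistics for an array of spike times (ms)."""
        isi = np.diff(np.sort(np.asarray(spike_times, dtype=float)))
        mean = isi.mean()
        std = isi.std(ddof=1)
        # CV near 0 indicates regular (coherent) firing; CV near 1 indicates Poisson-like firing.
        return {"mean_isi": mean, "std_isi": std, "cv": std / mean}

    # Example on a synthetic, noisy-periodic spike train.
    rng = np.random.default_rng(0)
    spikes = np.cumsum(10.0 + 2.0 * rng.standard_normal(500))
    print(isi_statistics(spikes))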

    A Bound on the Financial Value of Information

    It will be shown that each bit of information at most doubles the resulting wealth in the general stock market setup. This information bound on the growth of wealth is actually attained for certain probability distributions on the market investigated by Kelly. The bound will be shown to be a special case of the result that the increase in exponential growth of wealth achieved with true knowledge of the stock market distribution F over that achieved with incorrect knowledge G is bounded above by D(F||G), the entropy of F relative to G.
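
    In growth-rate notation (W denotes the exponential growth rate of wealth; the symbols here are mine, chosen to match the abstract), the two statements read:

    \[
      \Delta W \;\le\; I(X;Y)
      \qquad\text{and}\qquad
      W^*(F) \;-\; W_G(F) \;\le\; D(F \,\|\, G),
    \]

    where the first says that side information Y about the market outcome X raises the achievable growth rate by at most I(X;Y) bits, so wealth after one period grows by a factor of at most 2^{I(X;Y)} (each bit at most doubles wealth), and the second bounds the growth-rate loss from investing optimally for a believed distribution G when the true distribution is F.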

    Unsupervised feature selection for noisy data

    Feature selection techniques are widely applied in a variety of data analysis tasks in order to reduce dimensionality. According to the type of learning, feature selection algorithms are categorized as supervised or unsupervised. In unsupervised learning scenarios, selecting features is a much harder problem, due to the lack of class labels that would facilitate the search for relevant features. The difficulty of feature selection is amplified when the data is corrupted by different kinds of noise. Almost all traditional unsupervised feature selection methods are not robust against noise in the samples. These approaches have no explicit mechanism for detaching and isolating the noise, and thus they cannot produce an optimal feature subset. In this article, we propose an unsupervised approach for feature selection on noisy data, called Robust Independent Feature Selection (RIFS). Specifically, we choose the feature subset that contains most of the underlying information, using the same criteria as Independent Component Analysis (ICA). Simultaneously, the noise is separated as an independent component. The isolation of representative noise samples is achieved using factor oblique rotation, whereas noise identification is performed using factor pattern loadings. Extensive experimental results over diverse real-life data sets have shown the efficiency and advantage of the proposed algorithm. We thankfully acknowledge the support of the Comisión Interministerial de Ciencia y Tecnología (CICYT) under contract No. TIN2015-65316-P, which has partially funded this work. Peer reviewed. Postprint (author's final draft).
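
    A rough prototype of the ICA-based idea (not the RIFS algorithm itself, which additionally relies on factor oblique rotation and factor pattern loadings) can be assembled from off-the-shelf components; the sketch below uses scikit-learn's FastICA and flags the most Gaussian component (smallest absolute excess kurtosis) as the noise component before ranking features.

    import numpy as np
    from scipy.stats import kurtosis
    from sklearn.decomposition import FastICA

    def ica_feature_ranking(X, n_components=5, random_state=0):
        """Rank features by their loadings on the non-noise independent components.

        The component with the smallest absolute excess kurtosis (the most Gaussian
        one) is treated as the noise component and excluded from the ranking.
        """
        ica = FastICA(n_components=n_components, random_state=random_state)
        sources = ica.fit_transform(X)           # (n_samples, n_components)
        mixing = ica.mixing_                     # (n_features, n_components)
        noise_idx = np.argmin(np.abs(kurtosis(sources, axis=0)))
        signal = np.delete(mixing, noise_idx, axis=1)
        scores = np.abs(signal).sum(axis=1)      # per-feature contribution to signal components
        return np.argsort(scores)[::-1]          # feature indices, most informative first

    # Example on synthetic data: 200 samples, 10 features, additive Gaussian noise.
    rng = np.random.default_rng(0)
    X = rng.laplace(size=(200, 10)) + 0.5 * rng.standard_normal((200, 10))
    print(ica_feature_ranking(X)[:5])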

    Region graph partition function expansion and approximate free energy landscapes: Theory and some numerical results

    Graphical models for finite-dimensional spin glasses and real-world combinatorial optimization and satisfaction problems usually have an abundant number of short loops. The cluster variation method and its extension, the region graph method, are theoretical approaches for treating the complicated short-loop-induced local correlations. For graphical models represented by non-redundant or redundant region graphs, approximate free energy landscapes are constructed in this paper through the mathematical framework of region graph partition function expansion. Several free energy functionals are obtained, each of which uses a set of probability distribution functions or functionals as order parameters. These probability distribution functions or functionals are required to satisfy the region graph belief-propagation equation or the region graph survey-propagation equation, to ensure vanishing correction contributions from region subgraphs with dangling edges. As a simple application of the general theory, we perform region graph belief-propagation simulations on the square-lattice ferromagnetic Ising model and the Edwards-Anderson model. Considerable improvements over the conventional Bethe-Peierls approximation are achieved. Collective domains of different sizes in the disordered and frustrated square lattice are identified by the message-passing procedure. Such collective domains and the frustrations among them are responsible for the low-temperature glass-like dynamical behaviors of the system. Comment: 30 pages, 11 figures. More discussion on redundant region graphs. To be published by Journal of Statistical Physics.
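
    For orientation, the conventional Bethe-Peierls approximation against which the region graph results are compared corresponds to the standard belief-propagation fixed-point equations for a pairwise model with single-site potentials \psi_i and pair potentials \psi_{ij} (textbook notation, not the paper's region-graph notation):

    \[
      m_{i \to j}(x_j) \;\propto\; \sum_{x_i} \psi_i(x_i)\, \psi_{ij}(x_i, x_j)
        \prod_{k \in \partial i \setminus j} m_{k \to i}(x_i),
      \qquad
      b_i(x_i) \;\propto\; \psi_i(x_i) \prod_{k \in \partial i} m_{k \to i}(x_i),
    \]

    with \psi_{ij}(x_i, x_j) = \exp(\beta J_{ij} x_i x_j) in the Ising and Edwards-Anderson cases. The region graph equations generalize these edge messages to messages between regions, so that the short loops contained inside each region are treated exactly.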

    Self-organization in the olfactory system: one shot odor recognition in insects

    We show in a model of spiking neurons that synaptic plasticity in the mushroom bodies, in combination with the general fan-in, fan-out properties of the early processing layers of the olfactory system, might be sufficient to account for its efficient recognition of odors. For a large variety of initial conditions the model system consistently finds a working solution without any fine-tuning, and is, therefore, inherently robust. We demonstrate that gain control through the known feedforward inhibition of lateral horn interneurons increases the capacity of the system but is not essential for its general function. We also predict an upper limit for the number of odor classes Drosophila can discriminate based on the number and connectivity of its olfactory neurons.